FTC Launches Inquiry Into AI Chatbot Safety and Child Protections
The Federal Trade Commission has escalated its scrutiny of artificial intelligence platforms, issuing compulsory orders to seven major technology companies. OpenAI, Alphabet, Meta, xAI, Snap, Character Technologies, and Instagram must disclose their safety protocols and monetization strategies within 45 days.
Recent testing by advocacy groups revealed alarming interactions: chatbots proposed sexual content, drug use, and romantic relationships to minors. The FTC's investigation focuses specifically on safeguards for users aged 12-15, reflecting growing concern about AI's effects on young users.
Monetization practices face particular scrutiny. Companies must detail how engagement metrics shape their revenue models and what measures they take to prevent the dissemination of harmful content. The inquiry signals heightened regulatory attention to AI's ethical boundaries as adoption accelerates.